Regularized Nonparametric Logistic Regression and Kernel Regularization

Author

  • Fan Lu
Abstract

Regularization methods form a class of widely used techniques for obtaining robust solutions to ill-posed problems such as nonparametric regression and classification. In recent years, regularization methods have also been successfully applied to other classical problems in statistics, e.g. model/variable selection and dimension reduction. This thesis consists of two major parts, both within the framework of regularization methods. In the first part of this thesis, we are interested in the physics problem of detecting high-energy signal neutrino events. We propose a modification of the traditional nonparametric penalized likelihood approach that accounts for the use of importance sampling in generating the training data from computer experiments. We estimate the multivariate logit function of the signal neutrino events in order to find the most powerful decision boundary at a given significance level, optimally separating signal from background neutrinos. For simulated normal data, we compare this approach with a non-standard support vector machine (SVM) approach. The results suggest that, for weighted binary data, logistic regression is more appropriate than the SVM for recovering individual level curves of the logit function. We also propose a diagnostic plot to check the goodness of fit of the result when...
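The weighted penalized likelihood idea the abstract describes can be sketched as kernel logistic regression with per-case importance weights. The following is a minimal NumPy illustration, not the thesis's code: the data, the Gaussian kernel, the weights, and all constants are invented for the example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def weighted_kernel_logistic(X, y, w, lam=0.1, gamma=1.0, iters=25):
    """Penalized kernel logistic regression with per-case weights w.

    Minimizes  -sum_i w_i [y_i f_i - log(1 + e^{f_i})] + (lam/2) a' K a
    over f = K a, by Newton's method (iteratively reweighted least squares).
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    a = np.zeros(n)
    for _ in range(iters):
        f = np.clip(K @ a, -30, 30)          # clip to avoid overflow in exp
        p = 1.0 / (1.0 + np.exp(-f))
        grad = K @ (w * (p - y)) + lam * (K @ a)
        W = w * p * (1.0 - p)                # weighted IRLS weights
        H = K @ (W[:, None] * K) + lam * K + 1e-8 * np.eye(n)
        a -= np.linalg.solve(H, grad)
    return a, K

# toy weighted binary data: two Gaussian classes, invented importance weights
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (30, 2)), rng.normal(1.0, 1.0, (30, 2))])
y = np.r_[np.zeros(30), np.ones(30)]
w = rng.uniform(0.5, 2.0, 60)                # stand-in importance-sampling weights
a, K = weighted_kernel_logistic(X, y, w)
p_hat = 1.0 / (1.0 + np.exp(-np.clip(K @ a, -30, 30)))
acc = float(((p_hat > 0.5) == y).mean())
```

Because the fitted logit f = K a is estimated directly, individual level curves {x : f(x) = c} are available at any cut c, which is the property the abstract contrasts with the SVM's single decision boundary.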


Similar articles

Learning low-rank output kernels

Output kernel learning techniques make it possible to simultaneously learn a vector-valued function and a positive semidefinite matrix that describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel l...
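The paper's own factorized algorithm is not reproduced in this excerpt; the closest classical analogue of a low-rank constraint on a multi-output fit is reduced-rank ridge regression, sketched below on invented data (the rank, penalty, and noise level are all illustrative assumptions, not the paper's method).

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, T, r = 60, 8, 5, 2
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, T))   # rank-2 true coefficients
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, T))

# multi-output ridge fit, then truncate the fitted values to rank r:
# a crude low-rank constraint on the output structure
lam = 1.0
C = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
U, s, Vt = np.linalg.svd(X @ C, full_matrices=False)
Y_rr = (U[:, :r] * s[:r]) @ Vt[:r]                           # rank-r fitted matrix
rel_err = np.linalg.norm(Y - Y_rr) / np.linalg.norm(Y)
```

The truncation step is where the low-rank idea enters; the cited paper instead optimizes a factor of the output kernel directly, which avoids fitting a full-rank model first.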


Learning Rates of Least-Square Regularized Regression

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert...
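The algorithm analyzed in that line of work is kernel ridge regression: least-square loss plus an RKHS-norm penalty. A minimal NumPy sketch on invented 1-D data (kernel width, penalty, and target function are all assumptions for illustration):

```python
import numpy as np

def rbf(X, Z, gamma=10.0):
    # Gaussian kernel matrix between the rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(4)
X = rng.uniform(0.0, 1.0, (50, 1))
y = np.sin(4.0 * X[:, 0]) + 0.1 * rng.normal(size=50)

# regularized least squares in the RKHS: (K + lam I) alpha = y
lam = 1e-2
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(50), y)
f_hat = K @ alpha                     # fitted values at the training points
mse = float(np.mean((f_hat - np.sin(4.0 * X[:, 0])) ** 2))
```

The learning rates studied in the paper describe how fast estimates like `f_hat` approach the true regression function as the sample size grows and `lam` is tuned accordingly.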


On the estimation of transfer functions, regularizations and Gaussian processes - Revisited

Intrigued by some recent results on impulse response estimation by kernel and nonparametric techniques, we revisit the old problem of transfer function estimation from input-output measurements. We formulate a classical regularization approach, focused on finite impulse response (FIR) models, and find that regularization is necessary to cope with the high variance problem. This basic, regulariz...
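Regularized FIR estimation of the kind the excerpt discusses can be sketched in a few lines: a quadratic penalty built from an exponentially decaying kernel shrinks the high-variance least-squares estimate toward smooth, decaying impulse responses. Everything below (the true system, the kernel constants, the noise level) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
m, N = 20, 200
g_true = 0.8 ** np.arange(m)                 # true impulse response, decaying
u = rng.normal(size=N)                       # input signal
Phi = np.array([[u[t - k] if t - k >= 0 else 0.0 for k in range(m)]
                for t in range(N)])          # FIR regressor matrix
y = Phi @ g_true + 0.1 * rng.normal(size=N)

# exponentially decaying ("TC"-type) kernel prior: P_jk = c * lam**max(j, k)
c, lam, sig2 = 1.0, 0.8, 0.01
J, Kk = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
P = c * lam ** np.maximum(J, Kk)

# regularized estimate, in the equivalent "Gaussian process" form
g_reg = P @ Phi.T @ np.linalg.solve(Phi @ P @ Phi.T + sig2 * np.eye(N), y)
g_ls = np.linalg.lstsq(Phi, y, rcond=None)[0]   # unregularized least squares
```

The decaying kernel encodes the prior knowledge that stable impulse responses die out, which is exactly the variance-reduction mechanism the excerpt says regularization provides over plain least squares.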


Kernel methods and regularization techniques for nonparametric regression: Minimax optimality and adaptation

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression. We f...
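The two named members of that class differ in how they damp the small singular directions of the design: ridge shrinks all of them smoothly, while principal component regression keeps the leading ones and drops the rest. A toy NumPy comparison (data, penalty, and rank chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 10
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]                  # sparse true coefficients
X = rng.normal(size=(n, p))
y = X @ beta + 0.5 * rng.normal(size=n)

# Ridge regression: shrink every singular direction smoothly
lam = 1.0
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Principal component regression: regress on the k leading singular directions
k = 5
U, s, Vt = np.linalg.svd(X, full_matrices=False)
b_pcr = Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])
```

In the singular-value picture, ridge multiplies each direction by s**2 / (s**2 + lam) while PCR applies a hard 0/1 cutoff; the article's comparison of regularization techniques is about the consequences of such filter choices.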


Hyperparameter optimization with approximate gradient

Most models in machine learning contain at least one hyperparameter that controls model complexity. Choosing an appropriate set of hyperparameters is crucial for model accuracy and computationally challenging. In this work we propose an algorithm for the optimization of continuous hyperparameters using inexact gradient information. An advantage of this method is that hyperparamete...
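The excerpt does not include the paper's approximate-gradient algorithm, but the underlying idea, treating validation loss as a function of a continuous hyperparameter and descending its gradient, can be shown exactly for ridge regression, where the gradient of the validation loss in the penalty has a closed form. All data and constants below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
p = 5
beta = rng.normal(size=p)
X, Xv = rng.normal(size=(80, p)), rng.normal(size=(40, p))   # train / validation
y = X @ beta + rng.normal(size=80)
yv = Xv @ beta + rng.normal(size=40)

def val_loss_and_grad(lam):
    """Validation loss of the ridge fit, and its exact gradient in lam."""
    A = X.T @ X + lam * np.eye(p)
    b = np.linalg.solve(A, X.T @ y)      # ridge solution on the training split
    r = Xv @ b - yv                      # validation residual
    db = -np.linalg.solve(A, b)          # d b / d lam, from differentiating A b = X'y
    return float(r @ r), float(2.0 * r @ (Xv @ db))

# gradient descent on lam with backtracking, so the loss never increases
lam = 10.0
L0, _ = val_loss_and_grad(lam)
L, g = val_loss_and_grad(lam)
for _ in range(50):
    step = 1.0
    while step > 1e-10:
        cand = max(lam - step * g, 1e-8)
        Lc, gc = val_loss_and_grad(cand)
        if Lc <= L:
            lam, L, g = cand, Lc, gc
            break
        step /= 2.0
```

The paper's contribution is doing this when the inner solution `b` (and hence the hypergradient) is only available approximately, e.g. from a few iterations of an inner solver.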



Publication date: 2006